Regularized Least Square Regression with Unbounded and Dependent Sampling

Authors

  • Xiaorong Chu
  • Hongwei Sun
  • Changbum Chun
Abstract

Theorem 4. Suppose that the unbounded hypothesis with $p > 2$ holds, $L_K^{-r} f_\rho \in L^2_{\rho_X}(X)$ for some $r > 0$, and the $\alpha$-mixing coefficients satisfy a polynomial decay, that is, $\alpha_l \le b\,l^{-t}$ for some $b > 0$ and $t > 0$. Then, for any $0 < \eta < 1$, one has with confidence $1 - \eta$,
$$ \bigl\| f_{z,\gamma} - f_\rho \bigr\|_{\rho_X} = O\Bigl( m^{-\theta \min\{(p-2)t/p,\,1\}} (\log m)^{1/2} \Bigr), \tag{13} $$
where $\theta$ is given by a piecewise formula whose cases are truncated in this excerpt.
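
To make the setting concrete, the following is a minimal sketch (not taken from the paper) of the regularized least squares estimator $f_{z,\gamma}$ that the theorem analyzes, fitted on a dependent, heavy-tailed sample. The Gaussian kernel, the AR(1) input process, the Student-t noise, and all function names are illustrative assumptions.

```python
import numpy as np

def kernel_ridge_fit(X, y, gamma, width=1.0):
    """Regularized least squares in an RKHS: f_{z,gamma} minimizes
    (1/m) * sum_i (f(x_i) - y_i)^2 + gamma * ||f||_K^2; by the
    representer theorem f_{z,gamma} = sum_i alpha_i K(x_i, .), where
    (K + m * gamma * I) alpha = y."""
    m = X.shape[0]
    # Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 * width^2))
    # (an assumed choice; the theorem itself covers general Mercer kernels).
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-sq_dists / (2.0 * width ** 2))
    return np.linalg.solve(K + m * gamma * np.eye(m), y)

# Illustrative sample in the regime of the theorem: AR(1) inputs
# (dependent, hence alpha-mixing) and Student-t noise with df = 3,
# whose p-th moment is finite only for p < 3 (unbounded outputs).
rng = np.random.default_rng(0)
m = 200
x = np.zeros(m)
for i in range(1, m):
    x[i] = 0.5 * x[i - 1] + rng.normal()
X = x.reshape(-1, 1)
y = np.sin(X[:, 0]) + 0.1 * rng.standard_t(df=3, size=m)
alpha = kernel_ridge_fit(X, y, gamma=m ** -0.5)
```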


Similar Resources

Regularized Least Square Regression with Spherical Polynomial Kernels

This article considers regularized least square regression on the sphere. It develops a theoretical analysis of the generalization performance of the regularized least square regression algorithm with spherical polynomial kernels. Explicit bounds are derived for the excess risk. The learning rates depend on the eigenvalues of the spherical polynomial integral operators and on the dimension o...
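
As a concrete illustration of the kernel class involved, the sketch below evaluates a simple polynomial kernel restricted to the unit sphere; the particular form $(1 + \langle x, y\rangle)^d$ and the function name are assumptions for illustration, not taken from the article.

```python
import numpy as np

def spherical_poly_kernel(X, Y, degree=3):
    """Polynomial kernel K(x, y) = (1 + <x, y>)^degree, with the
    inputs first normalized onto the unit sphere S^{d-1}."""
    Xs = X / np.linalg.norm(X, axis=1, keepdims=True)
    Ys = Y / np.linalg.norm(Y, axis=1, keepdims=True)
    return (1.0 + Xs @ Ys.T) ** degree
```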


Concentration estimates for learning with unbounded sampling

We consider least-square regularization schemes for regression problems in reproducing kernel Hilbert spaces. The learning algorithm is implemented with samples drawn from unbounded sampling processes. The purpose of this talk is to present concentration estimates for the error based on ℓ²-empirical covering numbers, which improve the learning rates found in the literature.
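
For context, the metric underlying such covering numbers is the empirical $\ell^2$ distance on a sample $\mathbf{x} = (x_1, \dots, x_m)$; the standard definition below is supplied for the reader's convenience, not quoted from the talk.

```latex
% Empirical l^2 metric on a function class F, for a sample x_1, ..., x_m:
\[
  d_{2,\mathbf{x}}(f, g)
    = \Bigl( \frac{1}{m} \sum_{i=1}^{m} \bigl( f(x_i) - g(x_i) \bigr)^{2} \Bigr)^{1/2} .
\]
% The l^2-empirical covering number N_2(F, epsilon) is the smallest number
% of d_{2,x}-balls of radius epsilon needed to cover F, taken over samples.
```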


Performance Analysis of Regularized Linear Regression Models for Oxazolines and Oxazoles Derivative Descriptor Dataset

Regularized regression techniques for linear regression have been developed over the last few decades to remedy the shortcomings of ordinary least squares regression with regard to prediction accuracy. In this paper, new methods for using regularized regression in model selection are introduced, and we identify the conditions under which regularized regression improves our ability to discriminate between models. We a...
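
A minimal sketch of the contrast the abstract describes, comparing ordinary least squares with ridge (Tikhonov) regularization on a generic descriptor matrix; the variable names and the choice of ridge as the representative regularizer are assumptions.

```python
import numpy as np

def ols_fit(X, y):
    # Ordinary least squares: minimize ||X w - y||^2.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def ridge_fit(X, y, lam):
    # Ridge regression: minimize ||X w - y||^2 + lam * ||w||^2; the
    # penalty stabilizes w when descriptor columns are nearly collinear,
    # which is the prediction-accuracy flaw of OLS noted above.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```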


Reproducing Kernel Banach Spaces with the ℓ1 Norm II: Error Analysis for Regularized Least Square Regression

A typical approach to estimating the learning rate of a regularized learning scheme is to bound the excess generalization error by the sum of the sampling error, the hypothesis error, and the regularization error. Using a reproducing kernel space that satisfies the linear representer theorem brings the advantage of discarding the hypothesis error from the sum automatically. Following this direction, we ...
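
Schematically, the decomposition reads as follows, where $\mathcal{E}$ is the generalization error, $\mathcal{E}_z$ its empirical counterpart, and $f_\lambda$ a regularizing function in the hypothesis space; the notation is assumed here for illustration. When the linear representer theorem holds, the hypothesis-error term is nonpositive and can be dropped.

```latex
\[
  \mathcal{E}(f_z) - \mathcal{E}(f_\rho)
  = \underbrace{\bigl[\mathcal{E}(f_z) - \mathcal{E}_z(f_z)\bigr]
      + \bigl[\mathcal{E}_z(f_\lambda) - \mathcal{E}(f_\lambda)\bigr]}_{\text{sampling error}}
  + \underbrace{\mathcal{E}_z(f_z) - \mathcal{E}_z(f_\lambda)}_{\text{hypothesis error}}
  + \underbrace{\mathcal{E}(f_\lambda) - \mathcal{E}(f_\rho)}_{\text{regularization error}} .
\]
```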


Convergence Rate of Coefficient Regularized Kernel-based Learning Algorithms

We investigate machine learning for least square regression with data-dependent hypothesis spaces and coefficient regularization algorithms based on general kernels. We provide estimates of the learning rates for both regression and classification when the hypothesis spaces are sample dependent. Under a weak condition on the kernels we derive the learning error by estimating the rate of some K-f...
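
For concreteness, here is a minimal sketch of one common instance of coefficient regularization, with an $\ell^2$ penalty on the expansion coefficients rather than on the RKHS norm; the closed-form solve and the function name are assumptions.

```python
import numpy as np

def coef_regularized_fit(K, y, lam):
    """Coefficient regularization: with f = sum_i alpha_i K(x_i, .),
    minimize (1/m) * ||K alpha - y||^2 + lam * ||alpha||^2.
    The penalty acts on the coefficient vector alpha itself, not on
    the RKHS norm of f, so K may come from a general (not necessarily
    positive semidefinite) kernel."""
    m = K.shape[0]
    A = K.T @ K / m + lam * np.eye(m)
    return np.linalg.solve(A, K.T @ y / m)
```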




Publication date: 2014